Rule Based System for Recognizing Emotions Using Multimodal Approach

Author

  • Preeti Khanna
Abstract

Emotion is assuming increasing importance in human computer interaction (HCI), driven by the growing recognition that emotion is central to human communication and intelligence. Users expect not just functionality as a factor of usability, but experiences matched to their expectations, emotional states, and interaction goals. Endowing computers with this kind of intelligence for HCI is a complex task. It becomes more complex given that the interaction of humans with their environment (including other humans) is naturally multimodal: in reality, people use a combination of modalities, and these are not processed independently. In an attempt to render HCI more similar to human-human communication and enhance its naturalness, research on multiple modalities of human expression has seen ongoing progress in the past few years. Compared to unimodal approaches, multimodal emotion recognition raises additional problems, especially concerning the fusion architecture for multimodal information. In this paper we propose a rule based hybrid approach to combine information from various sources for recognizing target emotions. The results presented in this paper show that it is feasible to recognize human affective states with reasonable accuracy by combining the modalities using a rule based system.

Keywords: Human Computer Interaction (HCI); Multimodal emotion recognition; Rule based system; Emotional state; Modalities.
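To illustrate the kind of decision-level fusion the abstract describes, the sketch below combines per-emotion confidence scores from two modalities with simple hand-written rules. This is a minimal hypothetical example: the modality names, emotion labels, scores, and the rules themselves are illustrative assumptions, not the paper's actual rule set.

```python
# Hypothetical sketch of rule-based decision-level fusion for
# multimodal emotion recognition. All rules and values are
# illustrative assumptions, not the paper's actual system.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse(face_scores, speech_scores, agreement_bonus=0.2):
    """Combine per-emotion confidence scores from two modalities.

    Rule 1: if the top emotion of both modalities agrees,
            boost that emotion's fused score.
    Rule 2: otherwise, fall back to the average of the two
            modality scores and pick the maximum.
    """
    face_top = max(face_scores, key=face_scores.get)
    speech_top = max(speech_scores, key=speech_scores.get)

    # Start from the per-emotion average of the two modalities.
    fused = {e: (face_scores[e] + speech_scores[e]) / 2 for e in EMOTIONS}

    if face_top == speech_top:
        fused[face_top] += agreement_bonus  # Rule 1: reward agreement

    return max(fused, key=fused.get)

# Example: both modalities favour "happy", so Rule 1 applies.
face = {"happy": 0.7, "sad": 0.1, "angry": 0.1, "neutral": 0.1}
speech = {"happy": 0.5, "sad": 0.2, "angry": 0.2, "neutral": 0.1}
print(fuse(face, speech))  # prints "happy"
```

A real system would typically add rules for low-confidence or missing modalities (e.g., trusting the facial channel alone when audio quality is poor), which is where a rule-based fusion scheme can encode domain knowledge that a purely statistical combiner cannot.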


Similar articles

Bimodal Emotion Recognition: A Comparative Study of Rule Based System Vs Classification Algorithms

Emotions can be understood immediately in a face-to-face interaction, while in a human computer interaction (HCI) this response is limited. Research studies have been undertaken to investigate and develop various approaches and technologies to incorporate emotions in HCI. One major concern of HCI now is the need to improve the interactions between humans and computers through justifications and ex...


MAUI: a Multimodal Affective User Interface Sensing User’s Emotions based on Appraisal Theory - Questions about Facial Expressions..

We are developing a Multimodal Affective User Interface (MAUI) framework shown in Figure 1 and described in [5], aimed at recognizing its users' emotions by sensing various user-centered modalities (or modes), and at giving the users context-aware feedback via an intelligent affective agent using different agent-centered modes. The agent is built on an adaptive system architecture which...


A New Statistical Approach for Recognizing and Classifying Patterns of Control Charts (RESEARCH NOTE)

Control chart pattern (CCP) recognition techniques are widely used to identify potential process problems in modern industries. Recently, artificial neural network (ANN)-based techniques have become very popular for recognizing CCPs. However, finding a suitable architecture for an ANN-based CCP recognizer and its training process is time consuming and tedious. In addition, because of the black box ...


Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system that aims at recognizing its users’ emotions and at responding to them accordingly depending upon the current context or application. We then describe the design of the emotion...


MEMN: Multimodal Emotional Memory Network for Emotion Recognition in Dyadic Conversational Videos

Multimodal emotion recognition is a developing field of research which aims at detecting emotions in videos. For conversational videos, current methods mostly ignore the role of inter-speaker dependency relations when classifying emotions. In this paper, we address recognizing utterance-level emotions in dyadic conversations. We propose a deep neural framework, termed Multimodal Emotional Memo...




Publication date: 2013